Canadian Security Magazine

UK cybersecurity center says ‘deepfakes’ and other AI tools pose a threat to the next election

By The Associated Press   

LONDON (AP) — Britain’s cybersecurity agency said Tuesday that artificial intelligence poses a threat to the country’s next national election, and cyberattacks by hostile countries and their proxies are proliferating and getting harder to track.

The National Cyber Security Center said “this year has seen the emergence of state-aligned actors as a new cyber threat to critical national infrastructure” such as power, water and internet networks.

The center — part of Britain’s cyberespionage agency, GCHQ — said in its annual review that the past year also has seen “the emergence of a new class of cyber adversary in the form of state-aligned actors, who are often sympathetic to Russia’s further invasion of Ukraine and are ideologically, rather than financially, motivated.”

It said states and state-aligned groups pose “an enduring and significant threat,” from Russian-language criminals targeting British firms with ransomware attacks, to “China state-affiliated cyber actors” using their skills to pursue “strategic objectives which threaten the security and stability of U.K. interests.”

Echoing warnings by Britain’s MI5 and MI6 intelligence agencies, the center called the rise of China as a tech superpower “an epoch-defining challenge for U.K. security.”

“We risk China becoming the predominant power in cyberspace if our efforts to raise resilience and develop our capabilities do not keep pace,” it said.

The report also highlighted the threat posed by fast-evolving AI technology to elections, including a U.K. national election due to be held by January 2025.

While Britain’s old-fashioned method of voting, with pencil and paper, makes it hard for hackers to disrupt the vote itself, the center said deepfake videos and “hyper-realistic bots” would make the spread of disinformation during a campaign easier.
